Parallel and Communication Avoiding Least Angle Regression
Authors
Abstract
We are interested in parallelizing the least angle regression (LARS) algorithm for fitting linear models to high-dimensional data. We consider two parallel and communication avoiding versions of the basic LARS algorithm. The two algorithms have different asymptotic costs and practical performance: one offers more speedup, while the other produces more accurate output. The first is bLARS, a block version of the LARS algorithm, where we update $b$ columns at each iteration. Assuming that the data are row-partitioned, bLARS reduces the number of arithmetic operations, latency, and bandwidth by a factor of $b$. The second is tournament-bLARS (T-bLARS), a tournament version in which processors compete by running several LARS computations to choose the new columns to be added to the solution. Assuming that the data are column-partitioned, T-bLARS reduces latency by a factor of $b$. Similarly to LARS, our proposed methods generate a sequence of models. We present extensive numerical experiments that illustrate speedups up to 4x compared to LARS without any compromise in solution quality.
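To make the baseline concrete, here is a minimal NumPy sketch of the basic sequential LARS iteration that bLARS and T-bLARS build on: at each step the predictor most correlated with the residual joins the active set, and all active coefficients then move along the equiangular direction until the next predictor catches up. The function name `lars_path` and the fixed step count are illustrative choices, not from the paper, and this sketch omits the paper's blocking and tournament mechanisms entirely.

```python
import numpy as np

def lars_path(X, y, n_steps):
    """One-column-per-step LARS (teaching sketch): repeatedly add the
    predictor most correlated with the residual, then move the active
    coefficients along the equiangular direction until a new predictor
    attains the same absolute correlation."""
    n, p = X.shape
    beta = np.zeros(p)
    active = []
    for _ in range(min(n_steps, p)):
        c = X.T @ (y - X @ beta)                 # current correlations
        inactive = [k for k in range(p) if k not in active]
        active.append(max(inactive, key=lambda k: abs(c[k])))
        s = np.sign(c[active])                   # signs of active correlations
        XA = X[:, active] * s                    # sign-adjusted active columns
        w = np.linalg.solve(XA.T @ XA, np.ones(len(active)))
        AA = 1.0 / np.sqrt(w.sum())              # equiangular normalization
        w *= AA
        u = XA @ w                               # equiangular direction
        C = np.abs(c[active]).max()              # common correlation level
        rest = [k for k in range(p) if k not in active]
        if rest:
            a = X[:, rest].T @ u
            with np.errstate(divide="ignore"):
                cands = np.concatenate([(C - c[rest]) / (AA - a),
                                        (C + c[rest]) / (AA + a)])
            gamma = cands[cands > 1e-12].min()   # step until the next tie
        else:
            gamma = C / AA                       # final step: least squares fit
        beta[active] += gamma * s * w
    return beta, active
```

Run for the full `p` steps, this path ends at the ordinary least squares solution; the parallel variants in the paper change how the column selection and these linear solves are distributed, not the sequence of fitted models.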
Similar resources
Least Angle Regression
The purpose of model selection algorithms such as All Subsets, Forward Selection, and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regres...
Discussion of “Least Angle Regression”
Being able to reliably, and automatically, select variables in linear regression models is a notoriously difficult problem. This research attacks this question head on, introducing not only a computationally efficient algorithm and method, LARS (and its derivatives), but at the same time introducing comprehensive theory explaining the intricate details of the procedure as well as theory to guid...
Discussion of Least Angle Regression
Algorithms for simultaneous shrinkage and selection in regression and classification provide attractive solutions to knotty old statistical challenges. Nevertheless, as far as we can tell, Tibshirani’s Lasso algorithm has had little impact on statistical practice. Two particular reasons for this may be the relative inefficiency of the original Lasso algorithm, and the relative complexity of mor...
Robust groupwise least angle regression
Many regression problems exhibit a natural grouping among predictor variables. Examples are groups of dummy variables representing categorical variables, or present and lagged values of time series data. Since model selection in such cases typically aims for selecting groups of variables rather than individual covariates, an extension of the popular least angle regression (LARS) procedure to gr...
Least Angle and L1 Regression: A Review
Least Angle Regression is a promising technique for variable selection applications, offering a nice alternative to stepwise regression. It provides an explanation for the similar behavior of LASSO (L1-penalized regression) and forward stagewise regression, and provides a fast implementation of both. The idea has caught on rapidly, and sparked a great deal of research interest. In this paper, w...
Journal
Journal title: SIAM Journal on Scientific Computing
Year: 2021
ISSN: 1095-7197, 1064-8275
DOI: https://doi.org/10.1137/19m1305720